High-Capacity Quantum Associative Memories
Authors
Abstract
We review our models of quantum associative memories that represent the “quantization” of fully coupled neural networks like the Hopfield model. The idea is to replace the classical irreversible attractor dynamics, driven by an Ising model with pattern-dependent weights, with the reversible rotation of an input quantum state onto an output quantum state consisting of a linear superposition whose probability amplitudes are peaked on the stored pattern closest to the input in Hamming distance, resulting in a high probability of measuring a memory pattern very similar to the input. The unitary operator implementing this transformation can be formulated as a sequence of one-qubit and two-qubit elementary quantum gates and is thus the exponential of an ordered quantum Ising model with sequential operations and with pattern-dependent interactions, exactly as in the classical case. Probabilistic quantum memories, which make use of postselection of the measurement results of control qubits, overcome the famed linear storage limitation of their classical counterparts because they permit the complete elimination of crosstalk and spurious memories. The number of control qubits plays the role of an inverse fictitious temperature; the accuracy of pattern retrieval can be tuned by lowering the fictitious temperature below a critical value for quantum content association, while the complexity of the retrieval algorithm remains polynomial for any number of patterns polynomial in the number of qubits. These models thus solve the capacity shortage problem of classical associative memories, providing a polynomial improvement in capacity. The price to pay is the probabilistic nature of information retrieval.
PACS numbers: 03.67.-a, 03.67.Lx, 64.70.Tg
∗Electronic address: [email protected]
†Electronic address: [email protected]
Similar Papers
A Quantum Associative Memory Based on Grover’s Algorithm
Quantum computation uses microscopic quantum level effects to perform computational tasks and has produced results that in some cases are exponentially faster than their classical counterparts. The unique characteristics of quantum theory may also be used to create a quantum associative memory with a capacity exponential in the number of neurons. This paper combines two quantum computational al...
Quantum associative memory
This paper combines quantum computation with classical neural network theory to produce a quantum computational learning algorithm. Quantum computation uses microscopic quantum level effects to perform computational tasks and has produced results that in some cases are exponentially faster than their classical counterparts. The unique characteristics of quantum theory may also be used to create...
Probabilistic quantum memories
Typical address-oriented computer memories cannot recognize incomplete or noisy information. Associative (content-addressable) memories solve this problem but suffer from severe capacity shortages. I propose a model of a quantum memory that solves both problems. The storage capacity is exponential in the number of qubits and thus optimal. The retrieval mechanism for incomplete or noisy inputs is...
Artificial Associative Memory Using Quantum Processes
This paper discusses an approach to constructing an artificial quantum associative memory (QuAM). The QuAM makes use of two quantum computational algorithms, one for pattern storage and the other for pattern recall. The result is an exponential increase in the capacity of the memory when compared to traditional associative memories such as the Hopfield network. Further, the paper argues for con...
High capacity recurrent associative memories
Various algorithms for constructing weight matrices for Hopfield-type associative memories are reviewed, including ones with much higher capacity than the basic model. These alternative algorithms either iteratively approximate the projection weight matrix or use simple perceptron learning. An experimental investigation of the performance of networks trained by these algorithms is presented, in...
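For reference, the baseline these higher-capacity algorithms improve upon is the basic Hopfield model with Hebbian weights and sign-threshold attractor dynamics. The sketch below, a minimal illustration with illustrative names (not taken from the paper above), stores ±1 patterns and recalls one from a corrupted input:

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian (outer-product) weight matrix for a Hopfield network.
    patterns: array of shape (m, n) with entries +/-1."""
    m, n = patterns.shape
    w = patterns.T @ patterns / n  # sum of outer products, normalized by n
    np.fill_diagonal(w, 0.0)       # no self-coupling
    return w

def recall(w, state, steps=20):
    """Synchronous attractor dynamics: repeatedly threshold the local fields."""
    s = state.copy()
    for _ in range(steps):
        s = np.sign(w @ s)
        s[s == 0] = 1  # break ties deterministically
    return s

p1 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
p2 = np.array([1, -1, 1, -1, 1, -1, 1, -1])
w = hebbian_weights(np.stack([p1, p2]))
corrupted = p1.copy()
corrupted[-1] = 1  # flip one bit
print(recall(w, corrupted))  # converges back to p1
```

This Hebbian rule is what limits the basic model to roughly linear capacity in n; the iterative projection-matrix and perceptron-learning schemes mentioned above replace it with trained weights that store many more patterns.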